Search for: All records where Creators/Authors contains "Stephens, Lynn"


  1. Abstract

Interpreting and creating computational systems models is an important goal of science education. One aspect of computational systems modeling that is supported by the modeling, systems thinking, and computational thinking literature is “testing, evaluating, and debugging models.” Through testing and debugging, students can identify aspects of their models that either do not match external data or conflict with their conceptual understandings of a phenomenon. This disconnect encourages students to make model revisions, which in turn deepens their conceptual understanding of the phenomenon. Given that many students find testing and debugging challenging, we set out to investigate the testing and debugging behaviors and behavioral patterns that students use when building and revising computational systems models in a supportive learning environment. We designed and implemented a 6-week unit in which students constructed and revised a computational systems model of evaporative cooling using SageModeler software. Our results suggest that despite working in the same classroom, the three groups of students in this study each used a different testing and debugging behavioral pattern. Group 1 focused on using external peer feedback to identify flaws in their model, Group 2 used verbal and written discourse to critique their model’s structure and suggest structural changes, and Group 3 relied on systemic analysis of model output to drive model revisions. These results suggest that multiple aspects of the learning environment are necessary to enable students to take these different approaches to testing and debugging.

  2. Abstract

Developing and using models to make sense of phenomena or to design solutions to problems is a key science and engineering practice. Classroom use of technology-based tools can promote the development of students’ modelling practice, systems thinking, and causal reasoning by providing opportunities to develop and use models to explore phenomena. In previous work, we presented four aspects of system modelling that emerged during our development and initial testing of an online system modelling tool. In this study, we provide an in-depth examination of, and detailed evidence for, 10th-grade students engaging in those four aspects during a classroom enactment of a system modelling unit. We examine the choices students made when constructing their models, whether they described evidence and reasoning for those choices, and whether they connected the behavior of their models to the models’ usefulness for explaining and making predictions about the phenomena of interest. We conclude with a set of recommendations for designing curricular materials that leverage digital tools to facilitate iteratively constructing, using, evaluating, and revising models.
